Microsoft Azure
Designing CPUs for next-generation supercomputing
While GPU-accelerated AI dominates headlines, CPUs remain an optimal choice for many scientific and engineering workloads. In Seattle, a meteorologist analyzes dynamic atmospheric models to predict the next major storm system. In Stuttgart, an automotive engineer examines crash-test simulations for vehicle safety certification. And in Singapore, a financial analyst simulates portfolio stress tests to hedge against global economic shocks. Each of these professionals (and the consumers, commuters, and investors who depend on their insights) relies on a time-tested pillar of high-performance computing: the humble CPU. With GPU-powered AI breakthroughs getting the lion's share of press (and investment) in 2025, it is tempting to assume that CPUs are yesterday's news.
- Europe > Germany > Baden-Württemberg > Stuttgart Region > Stuttgart (0.25)
- Asia > Singapore (0.25)
- North America > United States > Massachusetts (0.05)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.37)
Privacy in Responsible AI: Approaches to Facial Recognition from Cloud Providers
As the use of facial recognition technology expands across different domains, ensuring its responsible use is gaining importance. This paper conducts a comprehensive literature review of existing studies on facial recognition technology from the perspective of privacy, one of the key Responsible AI principles. Cloud providers such as Microsoft, AWS, and Google are at the forefront of delivering facial-recognition technology services, but their approaches to the responsible use of these technologies vary significantly. This paper compares how these cloud giants implement the privacy principle in their facial recognition and detection services. By analysing their approaches, it identifies both common practices and notable differences. The results of this research will be valuable for developers and businesses, providing them with insights into the best practices of three major companies for integrating responsible AI, particularly privacy, into their cloud-based facial recognition technologies.
- North America > United States (1.00)
- Europe > United Kingdom > England > East Yorkshire > Hull (0.14)
- Research Report > New Finding (0.66)
- Research Report > Experimental Study (0.46)
Towards Resource-Efficient Compound AI Systems
Chaudhry, Gohar Irfan, Choukse, Esha, Goiri, Íñigo, Fonseca, Rodrigo, Belay, Adam, Bianchini, Ricardo
Compound AI Systems, integrating multiple interacting components like models, retrievers, and external tools, have emerged as essential for addressing complex AI tasks. However, current implementations suffer from inefficient resource utilization due to tight coupling between application logic and execution details, a disconnect between orchestration and resource management layers, and the perceived exclusiveness between efficiency and quality. We propose a vision for resource-efficient Compound AI Systems through a declarative workflow programming model and an adaptive runtime system for dynamic scheduling and resource-aware decision-making. Decoupling application logic from low-level details exposes levers for the runtime to flexibly configure the execution environment and resources, without compromising on quality. Enabling collaboration between the workflow orchestration and cluster manager enables higher efficiency through better scheduling and resource management. We are building a prototype system, called Murakkab, to realize this vision. Our preliminary evaluation demonstrates speedups up to $\sim 3.4\times$ in workflow completion times while delivering $\sim 4.5\times$ higher energy efficiency, showing promise in optimizing resources and advancing AI system design.
- North America > United States > New York > New York County > New York City (0.05)
- North America > United States > Utah > Salt Lake County > Salt Lake City (0.04)
- North America > United States > Texas > Harris County > Houston (0.04)
- (3 more...)
- Workflow (0.80)
- Research Report (0.52)
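The decoupling the abstract describes can be made concrete with a toy sketch. The API below is hypothetical (it is not Murakkab's actual interface): the application declares steps and their dependencies only, while a runtime decides execution order, which is the lever a real resource-aware scheduler would extend to hardware placement and model selection.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Step:
    name: str
    fn: Callable[[dict], dict]                      # application logic only
    requires: List[str] = field(default_factory=list)

class Runtime:
    """Toy scheduler: topologically orders declared steps and runs them.
    A real resource-aware runtime would also choose hardware, batch
    sizes, and model variants per step."""
    def __init__(self):
        self.steps: Dict[str, Step] = {}

    def step(self, name, requires=()):
        def register(fn):
            self.steps[name] = Step(name, fn, list(requires))
            return fn
        return register

    def run(self, ctx: dict) -> dict:
        done, order = set(), []
        def visit(n):
            if n in done:
                return
            for dep in self.steps[n].requires:
                visit(dep)
            done.add(n)
            order.append(n)
        for n in self.steps:
            visit(n)
        for n in order:
            ctx = self.steps[n].fn(ctx)
        return ctx

rt = Runtime()

@rt.step("retrieve")
def retrieve(ctx):
    ctx["docs"] = [f"doc about {ctx['query']}"]
    return ctx

@rt.step("generate", requires=["retrieve"])
def generate(ctx):
    ctx["answer"] = f"summary of {len(ctx['docs'])} doc(s)"
    return ctx

result = rt.run({"query": "cloud efficiency"})
print(result["answer"])  # summary of 1 doc(s)
```

Because the steps carry no execution details, the same declared workflow could be re-scheduled or re-provisioned by the runtime without touching application code, which is the collaboration between orchestration and resource management the abstract argues for.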
Microsoft confirms a CYBERATTACK was behind the latest outage that saw Outlook, Xbox, and Minecraft taken out for almost 10 hours
Microsoft has confirmed that its latest global outage was caused by a malicious cyberattack. The outage saw Outlook email services, Xbox Live, and even Minecraft go down for almost 10 hours yesterday afternoon - just two weeks after millions were affected by global outages. Microsoft now admits that its services were taken out by a Distributed Denial of Service (DDoS) attack which was 'amplified' by an error in the company's cyber defences. Experts say the true culprits may never be identified but that they were likely encouraged to strike by Microsoft's recent service troubles. Sylvain Cortes, vice president of strategy at cybersecurity firm Hackuity, told MailOnline: 'Rogue actors, cybergangs, and nation-states alike leverage these tactics, so further investigation is required to determine the origin of the threat.'
- Leisure & Entertainment > Games > Computer Games (1.00)
- Information Technology > Security & Privacy (1.00)
- Government > Military > Cyberwarfare (1.00)
- Information Technology > Security & Privacy (1.00)
- Information Technology > Communications > Social Media (0.76)
- Information Technology > Artificial Intelligence > Games > Computer Games (0.63)
Why does Prediction Accuracy Decrease over Time? Uncertain Positive Learning for Cloud Failure Prediction
Li, Haozhe, Ma, Minghua, Liu, Yudong, Zhao, Pu, Zheng, Lingling, Li, Ze, Dang, Yingnong, Chintalapati, Murali, Rajmohan, Saravan, Lin, Qingwei, Zhang, Dongmei
With the rapid growth of cloud computing, a variety of software services have been deployed in the cloud. To ensure the reliability of cloud services, prior studies focus on failure instance (disk, node, switch, etc.) prediction. Once the output of prediction is positive, mitigation actions are taken to rapidly resolve the underlying failure. According to our real-world practice in Microsoft Azure, we find that prediction accuracy may decrease by about 9% after retraining the models. The mitigation actions may result in uncertain positive instances, since failures cannot be verified after mitigation, which may introduce more noise while updating the prediction model. To the best of our knowledge, we are the first to identify this Uncertain Positive Learning (UPLearning) issue in the real-world cloud failure prediction scenario. To tackle this problem, we design an Uncertain Positive Learning Risk Estimator (Uptake) approach. Using two real-world datasets of disk failure prediction and conducting node prediction experiments in Microsoft Azure, which is a top-tier cloud provider that serves millions of users, we demonstrate that Uptake can significantly improve failure prediction accuracy by 5% on average.
- North America > United States > District of Columbia > Washington (0.05)
- North America > United States > New York > New York County > New York City (0.04)
- Oceania > Australia > New South Wales > Sydney (0.04)
- (5 more...)
- Information Technology > Data Science > Data Mining (1.00)
- Information Technology > Cloud Computing (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.68)
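The uncertain-positive issue above can be illustrated with a small sketch. This is not the paper's Uptake estimator, just a generic down-weighting scheme under an assumed encoding: instances that were mitigated before their failure could be confirmed are treated as uncertain positives and given a reduced label weight, rather than counted as certain failures at retraining time.

```python
def weighted_positive_rate(labels, weights):
    """Weighted fraction of positive labels in a training set."""
    total = sum(weights)
    pos = sum(w for y, w in zip(labels, weights) if y == 1)
    return pos / total

# 1 = predicted failure, 0 = healthy; 'verified' marks whether the
# failure was actually observed (mitigated instances cannot be verified).
labels   = [1, 1, 1, 0, 0, 0]
verified = [True, False, False, True, True, True]

# Naive retraining treats every mitigated instance as a true failure:
naive_w = [1.0] * len(labels)

# Risk-aware retraining discounts unverified positives (0.5 is an
# arbitrary illustrative weight, not a value from the paper):
risk_w = [1.0 if v else 0.5 for v in verified]

print(weighted_positive_rate(labels, naive_w))  # 0.5
print(weighted_positive_rate(labels, risk_w))   # 0.4
```

The naive scheme inflates the positive rate with unverifiable labels, which is the noise source the abstract identifies; a risk estimator's job is to choose those per-instance weights in a principled way instead of using a fixed constant.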
Microsoft to showcase purpose-built AI infrastructure at NVIDIA GTC
Join Microsoft at NVIDIA GTC, a free online global technology conference, March 20 to 23, to learn how organizations of any size can power AI innovation with purpose-built cloud infrastructure from Microsoft. Microsoft's Azure AI supercomputing infrastructure is uniquely designed for AI workloads and helps build and train some of the industry's most advanced AI solutions. From data preparation to model and infrastructure performance management, Azure's comprehensive portfolio of powerful and massively scalable GPU-accelerated virtual machines (VMs), together with seamless integration with services like Azure Batch and open-source solutions, helps streamline management and automation of large AI models and infrastructure. Attend NVIDIA GTC to discover how Azure AI infrastructure optimized for AI performance can deliver speed and scale in the cloud and help you reduce the complexity of building, training, and bringing AI models into production. Don't miss session S52469 featuring Nidhi Chappell, named to HPCwire's 2023 People to Watch list as a high-performance computing (HPC) luminary.
15 Top-Ranked Online Machine Learning Certificates 2023 - The Fordham Ram
As the field of machine learning continues to grow, there is an increasing demand for professionals with the skills and knowledge to develop and implement machine learning solutions. Whether you're a data scientist, engineer, or business professional looking to expand your skillset, pursuing an online machine learning course can help you stand out in the job market and take your career to the next level. To help you get started, we've compiled a list of 15 top-ranked online machine-learning certificates that you can pursue in 2023. These certificates are offered by reputable institutions and are designed to provide you with the theoretical knowledge and practical skills needed to succeed in the field of machine learning. Check out the 15 best machine learning courses below that will help you cement a spot in this ever-growing industry!
- Instructional Material > Course Syllabus & Notes (1.00)
- Instructional Material > Online (0.82)
- Education > Educational Setting > Online (0.51)
- Education > Educational Technology > Educational Software > Computer Based Training (0.30)
NVIDIA Teams With Microsoft to Build Massive Cloud AI Computer
NVIDIA today announced a multi-year collaboration with Microsoft to build one of the most powerful AI supercomputers in the world, powered by Microsoft Azure's advanced supercomputing infrastructure combined with NVIDIA GPUs, networking and full stack of AI software to help enterprises train, deploy and scale AI, including large, state-of-the-art models. Azure's cloud-based AI supercomputer includes powerful and scalable ND- and NC-series virtual machines optimized for AI distributed training and inference. It is the first public cloud to incorporate NVIDIA's advanced AI stack, adding tens of thousands of NVIDIA A100 and H100 GPUs, NVIDIA Quantum-2 400Gb/s InfiniBand networking and the NVIDIA AI Enterprise software suite to its platform. As part of the collaboration, NVIDIA will utilize Azure's scalable virtual machine instances to research and further accelerate advances in generative AI, a rapidly emerging area of AI in which foundational models like Megatron Turing NLG 530B are the basis for unsupervised, self-learning algorithms to create new text, code, digital images, video or audio. The companies will also collaborate to optimize Microsoft's DeepSpeed deep learning optimization software.
3 ways AI is transforming our world already, including ChatGPT
AI is having a moment. AI – or Artificial Intelligence for long -- has been a buzzword in the tech world for many years. And how it's been used to this point is more behind the scenes, everyday folks are now able to interact with AI and see how quickly it will transform our world. So, let's dive into first what AI is, and then look at three ways it's changing how we think about art, writing, as well as how it works hand-in-hand with machine learning. OK, by now we've all heard the term "AI" thrown around.